Combining heterogeneous classifiers for stock selection

Authors

  • George Albanis
  • Roy Batchelor
Abstract

Combining unbiased forecasts of continuous variables necessarily reduces the error variance below that of the median individual forecast. However, this does not necessarily hold for forecasts of discrete variables, or where the costs of errors are not directly related to the error variance. This paper investigates empirically the benefits of combining forecasts of outperforming shares, based on five linear and nonlinear statistical classification techniques, including neural network and recursive partitioning methods. We find that simple “Majority Voting” improves accuracy and profitability only marginally. Much greater gains come from applying the “Unanimity Principle”, whereby a share is not held in the high-performing portfolio unless all classifiers agree.
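The two combination rules contrasted in the abstract can be sketched in a few lines of Python. This is a minimal illustration only: the share names and classifier signals below are hypothetical, not data from the paper, and the five signals per share simply stand in for the paper's five classification techniques.

```python
def majority_vote(signals):
    """Hold the share if more than half of the classifiers signal 'outperform'."""
    return sum(signals) > len(signals) / 2

def unanimity(signals):
    """Unanimity Principle: hold the share only if ALL classifiers agree."""
    return all(signals)

# Hypothetical binary signals: signals[j] = 1 if classifier j
# predicts the share will outperform, 0 otherwise.
shares = {
    "SHARE_A": [1, 1, 1, 1, 1],  # all five classifiers agree
    "SHARE_B": [1, 1, 1, 0, 1],  # one dissenter: passes majority, fails unanimity
    "SHARE_C": [1, 0, 0, 1, 0],  # minority view: excluded by both rules
}

majority_portfolio = [s for s, v in shares.items() if majority_vote(v)]
unanimous_portfolio = [s for s, v in shares.items() if unanimity(v)]

print(majority_portfolio)   # ['SHARE_A', 'SHARE_B']
print(unanimous_portfolio)  # ['SHARE_A']
```

The unanimous portfolio is necessarily a subset of the majority-vote portfolio; the paper's finding is that this stricter filter trades breadth for a higher concentration of genuine outperformers.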


Related articles

Boosting Localized Classifiers in Heterogeneous Databases

Combining multiple global models (e.g. back-propagation-based neural networks) is an effective technique for improving classification accuracy. This technique reduces variance by manipulating the distribution of the training data. In many large-scale data analysis problems involving heterogeneous databases with attribute instability, standard boosting methods can be improved by coalescing multi...


Feature selection using genetic algorithm for breast cancer diagnosis: experiment on three different datasets

Objective(s): This study addresses feature selection for breast cancer diagnosis. The process uses a wrapper approach combining GA-based feature selection with a PS-classifier. Experimental results show that the proposed model is comparable to other models on the Wisconsin breast cancer datasets. Materials and Methods: To evaluate the effectiveness of the proposed feature selection method, we ...


Using Negative Correlation Learning to Improve the Performance of Neural Network Ensembles

This paper investigates the effect of diversity induced by Negative Correlation Learning (NCL) in combinations of neural classifiers and presents an efficient way to improve combining performance. Decision Templates and Averaging, as two non-trainable combining methods, and Stacked Generalization, as a trainable combiner, are investigated in our experiments. Utilizing NCL for diversifying the ba...


Comparison of Classifier Selection Methods for Improving Committee Performance

Combining classifiers is an effective way of improving classification performance. In many situations it is possible to construct several classifiers with different characteristics. Selecting the member classifiers with the best individual performance can be shown to be suboptimal in several cases, and hence there exists a need to attempt to find effective member classifier selection methods. I...


Evaluation of Ensemble Classifiers for Handwriting Recognition

One of the major developments in machine learning in the past decade is the ensemble method, which builds a highly accurate classifier by combining many moderately accurate component classifiers. In this work, new ensemble classification methods are proposed: homogeneous ensembles using bagging and heterogeneous ensembles using an arcing classifier, and their performa...




Journal:
  • Int. Syst. in Accounting, Finance and Management

Volume 15, Issue

Pages  -

Published 2007